Refactoring acoustic models using variational expectation-maximization

Authors

  • Pierre L. Dognin
  • John R. Hershey
  • Vaibhava Goel
  • Peder A. Olsen
Abstract

In probabilistic modeling, it is often useful to change the structure, or refactor, a model, so that it has a different number of components, different parameter sharing, or other constraints. For example, we may wish to find a Gaussian mixture model (GMM) with fewer components that best approximates a reference model. Maximizing the likelihood of the refactored model under the reference model is equivalent to minimizing their KL divergence. For GMMs, this optimization is not analytically tractable. However, a lower bound to the likelihood can be maximized using a variational expectation-maximization algorithm. Automatic speech recognition provides a good framework to test the validity of such methods, because we can train reference models of any given size for comparison with refactored models. We show that we can efficiently reduce model size by 50%, with the same recognition performance as the corresponding model trained from data.
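To make the procedure concrete, the following is a minimal, hypothetical sketch of a variational EM loop of this general kind, written for diagonal-covariance GMMs. The function names, the initialization from a random subset of reference components, and the diagonal restriction are simplifications of mine, not details from the paper. The E-step assigns each reference component a soft responsibility over refactored components, proportional to the refactored weight times exp(−KL) between the components; the M-step moment-matches each refactored component to its weighted share of the reference model.

```python
import numpy as np

def kl_diag_gauss(mu_a, var_a, mu_b, var_b):
    """KL divergence between two diagonal-covariance Gaussians."""
    return 0.5 * np.sum(
        np.log(var_b / var_a) + (var_a + (mu_a - mu_b) ** 2) / var_b - 1.0
    )

def refactor_gmm(pi, mu, var, n_out, n_iter=50, seed=0):
    """Compress a diagonal GMM (pi, mu, var) to n_out components by
    variational EM, maximizing a lower bound on the expected
    log-likelihood of the refactored model under the reference model."""
    rng = np.random.default_rng(seed)
    n_in, d = mu.shape
    # Initialize the refactored model from a random subset of components.
    idx = rng.choice(n_in, size=n_out, replace=False)
    w, m, v = np.full(n_out, 1.0 / n_out), mu[idx].copy(), var[idx].copy()
    for _ in range(n_iter):
        # E-step: responsibilities phi[a, b] proportional to
        # w_b * exp(-KL(f_a || g_b)), normalized over b for each a.
        logphi = np.array(
            [[np.log(w[b]) - kl_diag_gauss(mu[a], var[a], m[b], v[b])
              for b in range(n_out)] for a in range(n_in)]
        )
        logphi -= logphi.max(axis=1, keepdims=True)
        phi = np.exp(logphi)
        phi /= phi.sum(axis=1, keepdims=True)
        # M-step: moment-matched updates weighted by pi_a * phi[a, b].
        # (A production version would guard against empty components.)
        r = pi[:, None] * phi                      # shape (n_in, n_out)
        w = r.sum(axis=0)
        m = (r.T @ mu) / w[:, None]
        # Second moments absorb component spreads and mean shifts.
        sq = var[:, None, :] + (mu[:, None, :] - m[None, :, :]) ** 2
        v = np.einsum('ab,abd->bd', r, sq) / w[:, None]
    return w, m, v

# Example: compress a random 8-component GMM in 2-D to 4 components.
pi = np.full(8, 1.0 / 8)
mu0 = np.random.default_rng(1).normal(size=(8, 2))
var0 = np.ones((8, 2))
w, m, v = refactor_gmm(pi, mu0, var0, n_out=4)
```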

Similar resources

Restructuring exponential family mixture models

Variational KL (varKL) divergence minimization was previously applied to restructuring acoustic models (AMs) using Gaussian mixture models by reducing their size while preserving their accuracy. In this paper, we derive a related varKL for exponential family mixture models (EMMs) and test its accuracy using the weighted local maximum likelihood agglomerative clustering technique. Minimizing var...
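For reference, the variational approximation to the KL divergence between two mixtures that this line of work builds on has a closed form whenever the pairwise component divergences are tractable, which covers both the Gaussian case and exponential family components. In my notation (which may differ from the paper's), for f(x) = Σ_a π_a f_a(x) and g(x) = Σ_b ω_b g_b(x):

```latex
% Variational approximation to D(f || g) for two mixture models;
% it requires only the pairwise component divergences D(f_a || g_b).
D_{\mathrm{var}}(f \,\|\, g)
  = \sum_a \pi_a \log
    \frac{\sum_{a'} \pi_{a'} \, e^{-D(f_a \,\|\, f_{a'})}}
         {\sum_b \omega_b \, e^{-D(f_a \,\|\, g_b)}}
```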

Algorithmic improvements for variational inference

Variational methods for approximate inference in machine learning often adapt a parametric probability distribution to optimize a given objective function. This view is especially useful when applying variational Bayes (VB) to models outside the conjugate-exponential family. For such models, variational Bayesian expectation-maximization (VB EM) algorithms are not easily available, and gradient-based m...
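As a concrete (hypothetical) instance of the gradient-based alternative this snippet alludes to, the sketch below fits a Gaussian q(θ) = N(m, s²) to an unnormalized, non-conjugate 1-D posterior by stochastic-gradient ascent on a Monte Carlo ELBO estimate, using the reparameterization θ = m + s·ε. The toy target and all names are illustrative, not from the cited paper.

```python
import numpy as np

def grad_log_target(theta):
    """d/dtheta of a toy non-conjugate unnormalized log posterior:
    standard-normal prior times a logistic likelihood term."""
    return -theta + 2.0 / (1.0 + np.exp(2.0 * theta))

def fit_gaussian_vi(n_steps=2000, n_samples=64, lr=0.02, seed=0):
    """Fit q(theta) = N(m, s^2) by stochastic-gradient ascent on the
    ELBO, using theta = m + s * eps with eps ~ N(0, 1)."""
    rng = np.random.default_rng(seed)
    m, log_s = 0.0, 0.0
    for _ in range(n_steps):
        eps = rng.standard_normal(n_samples)
        s = np.exp(log_s)
        g = grad_log_target(m + s * eps)
        # Reparameterized ELBO gradients; the +1.0 is the entropy term,
        # since d/d(log s) of 0.5 * log(2*pi*e*s^2) equals 1.
        m += lr * g.mean()
        log_s += lr * (np.mean(g * s * eps) + 1.0)
    return m, np.exp(log_s)
```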

Streaming Variational Inference for Dirichlet Process Mixtures

Bayesian nonparametric models are theoretically well suited to learning from streaming data, because their complexity can grow with the volume of observed data. However, most existing variational inference algorithms are not applicable in streaming settings, since they require truncating the variational distributions. In this paper, we present two truncation-free variational algorithms, one for mix...

Segmentation of colour images using variational expectation-maximization algorithm

The approach proposed in this paper takes into account the uncertainty in colour modelling by employing variational Bayesian estimation. Mixtures of Gaussians are considered for modelling colour images. Distributions of parameters characterising colour regions are inferred from data statistics. The Variational Expectation-Maximization (VEM) algorithm is used for estimating the hyperparameters c...

Finding hypergraph communities: a Bayesian approach and variational solution

Data clustering, including problems such as finding network communities, can be put into a systematic framework by means of a Bayesian approach. Here we address the Bayesian formulation of the problem of finding hypergraph communities. We start by introducing a hypergraph generative model with a built-in group structure. Using a variational calculation we derive a variational Bayes algorithm, a...

Publication date: 2009